A Modified Error Function to Improve the Error Back-Propagation Algorithm for Multi-Layer Perceptrons

Authors

  • Sang-Hoon Oh
  • Youngjik Lee
Abstract


Similar Articles

Modeling of measurement error in refractive index determination of fuel cell using neural network and genetic algorithm

Abstract: In this paper, a method for determining the refractive index of a fuel-cell membrane, based on a three-longitudinal-mode laser heterodyne interferometer, is presented. The optical path difference between the target and reference paths is fixed, and the phase shift is then calculated in terms of the refractive-index shift. The measurement accuracy of this system is limited by nonlinearity erro...
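The abstract does not reproduce the underlying phase relation; as a hedged sketch, for an interferometer with a fixed geometric path length L and laser wavelength \lambda (symbols assumed here, not taken from the paper), a standard form is

    \Delta\varphi = \frac{2\pi L}{\lambda}\,\Delta n
    \quad\Longrightarrow\quad
    \Delta n = \frac{\lambda\,\Delta\varphi}{2\pi L}

so the measured phase shift maps linearly onto the refractive-index shift.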


Prospective Hardware Implementation of the CHIR Neural Network Algorithm

I review the recently developed Choice of Internal Representations (CHIR) training algorithm for multi-layer perceptrons, with an emphasis on properties relevant to hardware implementation. A comparison with the common error back-propagation algorithm shows that there are potential advantages to realizing CHIR in hardware...


Prediction of the Liquid Vapor Pressure Using the Artificial Neural Network-Group Contribution Method

In this paper, the vapor pressure of pure compounds is estimated using Artificial Neural Networks combined with a simple Group Contribution Method (ANN-GCM). To make the model comprehensive, materials were chosen from various families; most belong to 12 families. Vapor pressure data for 100 compounds are used to train, validate, and test the ANN-GCM model. Va...
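The abstract does not give the network layout or the group set; the following is a minimal Python sketch of the general ANN-GCM idea only, in which counts of functional groups form the input vector of a small feed-forward network predicting log vapor pressure. All dimensions and names (n_groups, n_hidden, predict_log_vp) are illustrative assumptions.

    import numpy as np

    # Hypothetical ANN-GCM sketch: input = functional-group counts,
    # output = estimated log vapor pressure. Sizes are assumptions.
    rng = np.random.default_rng(0)
    n_groups, n_hidden = 20, 10
    W1 = rng.normal(0.0, 0.1, (n_hidden, n_groups))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.1, n_hidden)
    b2 = 0.0

    def predict_log_vp(group_counts):
        """Forward pass: group counts -> tanh hidden layer -> scalar."""
        h = np.tanh(W1 @ group_counts + b1)
        return float(W2 @ h + b2)

    # Toy usage: made-up group counts for one compound.
    x = rng.integers(0, 4, n_groups).astype(float)
    print(predict_log_vp(x))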


Error back-propagation algorithm for classification of imbalanced data

Classification of imbalanced data is pervasive, but it is a difficult problem to solve. To improve the classification of imbalanced data, this letter proposes a new error function for the error back-propagation algorithm of multilayer perceptrons. The error function intensifies weight updating for the minority class and weakens weight updating for the majority class. We verify the effect...
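The snippet does not reproduce the letter's exact error function; the sketch below uses a common stand-in with the same qualitative effect, a class-weighted squared error in which each class's weight is inversely proportional to its frequency, so minority-class patterns produce larger backpropagated deltas. All names here are illustrative.

    import numpy as np

    def class_weights(labels):
        """Per-class weights, inversely proportional to class frequency."""
        counts = np.bincount(labels, minlength=2).astype(float)
        return counts.sum() / (2.0 * counts)   # rarer class -> larger weight

    def weighted_error(y_pred, y_true, w):
        """Squared error scaled by the weight of the pattern's class."""
        return 0.5 * w[y_true] * (y_pred - y_true) ** 2

    def weighted_error_grad(y_pred, y_true, w):
        """dE/dy_pred; the class weight scales the weight update."""
        return w[y_true] * (y_pred - y_true)

    labels = np.array([0] * 90 + [1] * 10)   # 9:1 imbalanced toy data
    w = class_weights(labels)                # w = [0.556, 5.0]
    print(weighted_error_grad(0.8, 1, w))    # minority pattern: -1.0
    print(weighted_error_grad(0.8, 0, w))    # majority pattern: ~0.44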


Deep Big Multilayer Perceptrons for Digit Recognition

The competitive MNIST handwritten digit recognition benchmark has a long history of broken records since 1998. The most recent advancement by others dates back eight years (error rate 0.4%). Good old on-line back-propagation for plain multi-layer perceptrons yields a very low 0.35% error rate on the MNIST handwritten digits benchmark with a single MLP, and 0.31% with a committee of seven MLPs. All we...
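For reference, "on-line back-propagation" means updating the weights after every single training pattern rather than per batch. The sketch below shows one such step for a one-hidden-layer sigmoid MLP with squared error; the toy dimensions and learning rate are assumptions, and the paper's networks were far larger and trained on deformed MNIST images with GPUs.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out, lr = 784, 100, 10, 0.01   # assumed sizes
    W1 = rng.normal(0, 0.05, (n_hid, n_in)); b1 = np.zeros(n_hid)
    W2 = rng.normal(0, 0.05, (n_out, n_hid)); b2 = np.zeros(n_out)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def online_step(x, t):
        """One forward/backward pass and an immediate weight update."""
        global W1, b1, W2, b2
        h = sigmoid(W1 @ x + b1)            # hidden activations
        y = sigmoid(W2 @ h + b2)            # output activations
        d2 = (y - t) * y * (1 - y)          # output delta (squared error)
        d1 = (W2.T @ d2) * h * (1 - h)      # hidden delta
        W2 -= lr * np.outer(d2, h); b2 -= lr * d2
        W1 -= lr * np.outer(d1, x); b1 -= lr * d1

    # Toy usage: one random "image", one-hot target for digit 3.
    x = rng.random(n_in)
    t = np.zeros(n_out); t[3] = 1.0
    online_step(x, t)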



Journal:

Volume   Issue

Pages   -

Publication date: 1995